$$
\begin{align}
\mathbb{E}(u) &= \mathbb{P}[u=0] \cdot 0 + \mathbb{P}[u=1] \cdot 1 \\
&= \frac{1}{2} \cdot 0 + \frac{1}{2} \cdot 1 \\
&= 0 + \frac{1}{2} \\
&= \frac{1}{2}
\end{align}
$$

Therefore, since $0 < \epsilon < \frac{1}{2}$:

$$
\begin{align}
\mathbb{E}(u) + 0 &< \mathbb{E}(u) + \epsilon < \mathbb{E}(u) + \frac{1}{2} \\
\frac{1}{2} + 0 &< \mathbb{E}(u) + \epsilon < \frac{1}{2} + \frac{1}{2} \\
\frac{1}{2} &< \mathbb{E}(u) + \epsilon < 1 \\
\Rightarrow 0 &< \mathbb{E}(u) + \epsilon < 1
\end{align}
$$

So the results in (c) still apply by setting $\alpha = \mathbb{E}(u) + \epsilon$.
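
As a quick sanity check (a numerical sketch, not part of the derivation; the sample values of $\epsilon$ are arbitrary), we can confirm that $\mathbb{E}(u) = \frac{1}{2}$ and that $\alpha$ stays strictly inside $(0, 1)$:

```python
# E(u) = 1/2 for a fair coin, and alpha = E(u) + eps stays strictly
# inside (0, 1) whenever 0 < eps < 1/2.
E_u = 0.5 * 0 + 0.5 * 1  # P[u=0]*0 + P[u=1]*1
assert E_u == 0.5

for eps in [0.01, 0.1, 0.25, 0.49]:
    alpha = E_u + eps
    assert 0 < alpha < 1  # the condition needed for (c) to apply
```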

In (c), we found that $e^{-s\alpha}U(s)$ is minimized at $s=\ln{\frac{\alpha}{1-\alpha}}$. Substituting that value of $s$ keeps the bound on $\mathbb{P}[u \geq \alpha]$ as tight as possible:

$$
\begin{align}
V(s) &= \frac{1}{2}\left(\frac{1}{e^{s\alpha}}+\frac{e^s}{e^{s\alpha}}\right) \\
&= \frac{1}{2}\left(\frac{1}{e^{\alpha\ln{\frac{\alpha}{1-\alpha}}}}+\frac{e^{\ln{\frac{\alpha}{1-\alpha}}}}{e^{\alpha\ln{\frac{\alpha}{1-\alpha}}}}\right) \\
&= \frac{1}{2}\left(\frac{1}{\left(e^{\ln{\frac{\alpha}{1-\alpha}}}\right)^\alpha}+\frac{e^{\ln{\frac{\alpha}{1-\alpha}}}}{\left(e^{\ln{\frac{\alpha}{1-\alpha}}}\right)^\alpha}\right) \\
&= \frac{1}{2}\left(\frac{1}{\left(\frac{\alpha}{1-\alpha}\right)^\alpha}+\frac{\frac{\alpha}{1-\alpha}}{\left(\frac{\alpha}{1-\alpha}\right)^\alpha}\right) \\
&= \frac{1}{2}\left(\left(\frac{1-\alpha}{\alpha}\right)^\alpha + \frac{\alpha(1-\alpha)^\alpha}{(1-\alpha)\alpha^\alpha}\right) \\
&= \frac{1}{2}\left(1+\frac{\alpha}{1-\alpha}\right)\left(\frac{1-\alpha}{\alpha}\right)^\alpha \\
&= \frac{1}{2}\left(\frac{1-\alpha}{1-\alpha}+\frac{\alpha}{1-\alpha}\right)\left(\frac{1-\alpha}{\alpha}\right)^\alpha \\
&= \frac{1}{2}\cdot\frac{1}{1-\alpha}\left(\frac{1-\alpha}{\alpha}\right)^\alpha
\end{align}
$$
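
To double-check this closed form (a numerical sketch; `V` below is written directly from the definition $V(s) = e^{-s\alpha}U(s)$ with $U(s) = \frac{1}{2}(1+e^s)$, and the sample values of $\alpha$ are arbitrary), we can confirm that $s = \ln{\frac{\alpha}{1-\alpha}}$ reproduces the formula above and beats nearby values of $s$:

```python
import math

def V(s, alpha):
    # V(s) = e^{-s*alpha} * U(s), with U(s) = (1 + e^s) / 2 for a fair coin
    return 0.5 * (math.exp(-s * alpha) + math.exp(s * (1 - alpha)))

for alpha in [0.55, 0.7, 0.9]:
    s_star = math.log(alpha / (1 - alpha))
    closed_form = 0.5 * (1 / (1 - alpha)) * ((1 - alpha) / alpha) ** alpha
    assert math.isclose(V(s_star, alpha), closed_form)
    # s_star should be no worse than any nearby s
    assert all(V(s_star, alpha) <= V(s_star + d, alpha)
               for d in (-1.0, -0.1, 0.1, 1.0))
```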

Now we replace $\alpha$ with $\mathbb{E}(u)+\epsilon$:

$$
\begin{align}
V(s) &= 2^{-1}\left(\frac{1}{1-(\mathbb{E}(u)+\epsilon)}\left(\frac{1-(\mathbb{E}(u)+\epsilon)}{\mathbb{E}(u)+\epsilon}\right)^{\mathbb{E}(u)+\epsilon}\right) \\
&= 2^{-1}\left(\frac{1}{1-(\frac{1}{2}+\epsilon)}\left(\frac{1-(\frac{1}{2}+\epsilon)}{\frac{1}{2}+\epsilon}\right)^{\frac{1}{2}+\epsilon}\right) \\
&= 2^{-1}\left(\frac{1}{\frac{1}{2}-\epsilon}\left(\frac{\frac{1}{2}-\epsilon}{\frac{1}{2}+\epsilon}\right)^{\frac{1}{2}+\epsilon}\right) \\
&= 2^{-1}\left(2^{-\log_2{(\frac{1}{2}-\epsilon)}}\left(2^{\log_2{(\frac{1}{2}-\epsilon)}-\log_2{(\frac{1}{2}+\epsilon)}}\right)^{\frac{1}{2}+\epsilon}\right) \\
&= 2^{-1}\left(2^{-\log_2{(\frac{1}{2}-\epsilon)}}\, 2^{(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}-\epsilon)}-(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}}\right) \\
&= 2^{-1}\left(2^{-\log_2{(\frac{1}{2}-\epsilon)}+(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}-\epsilon)}-(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}}\right) \\
&= 2^{-1}\left(2^{(\frac{1}{2}+\epsilon-1)\log_2{(\frac{1}{2}-\epsilon)}-(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}}\right) \\
&= 2^{-1}\left(2^{(-\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}-\epsilon)}-(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}}\right) \\
&= 2^{-1}\left(2^{-(\frac{1}{2}-\epsilon)\log_2{(\frac{1}{2}-\epsilon)}-(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}}\right) \\
&= 2^{-1-(\frac{1}{2}-\epsilon)\log_2{(\frac{1}{2}-\epsilon)}-(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}} \\
&= 2^{-(1+(\frac{1}{2}-\epsilon)\log_2{(\frac{1}{2}-\epsilon)}+(\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)})} \\
&= 2^{-\beta}
\end{align}
$$

where $\beta = 1 + (\frac{1}{2}-\epsilon)\log_2{(\frac{1}{2}-\epsilon)} + (\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}$.
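
The algebra above is easy to get wrong, so here's a small numerical check (a sketch using nothing beyond the formulas already derived; the sample values of $\epsilon$ are arbitrary) that the closed form for $V(s)$ really equals $2^{-\beta}$:

```python
import math

def beta(eps):
    # beta = 1 + (1/2 - eps) log2(1/2 - eps) + (1/2 + eps) log2(1/2 + eps)
    return (1 + (0.5 - eps) * math.log2(0.5 - eps)
              + (0.5 + eps) * math.log2(0.5 + eps))

for eps in [0.05, 0.1, 0.3, 0.45]:
    alpha = 0.5 + eps
    closed_form = 0.5 * (1 / (1 - alpha)) * ((1 - alpha) / alpha) ** alpha
    assert math.isclose(closed_form, 2 ** (-beta(eps)))
```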

And so

$$
\begin{align}
\mathbb{P}[u \geq \mathbb{E}(u) + \epsilon] &\leq V(s)^N \\
&= (2^{-\beta})^N \\
&= 2^{-\beta N}
\end{align}
$$
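
As an empirical check (a Monte Carlo sketch, assuming, as in the problem setup, that $u$ is the fraction of heads in $N$ independent fair coin flips; `N`, `eps`, and `trials` are arbitrary choices), the observed frequency of $u \geq \frac{1}{2}+\epsilon$ should land below $2^{-\beta N}$:

```python
import math
import random

def beta(eps):
    return (1 + (0.5 - eps) * math.log2(0.5 - eps)
              + (0.5 + eps) * math.log2(0.5 + eps))

random.seed(0)
N, eps, trials = 100, 0.1, 20_000
hits = sum(
    sum(random.randint(0, 1) for _ in range(N)) / N >= 0.5 + eps
    for _ in range(trials)
)
print(hits / trials)          # empirical tail probability, ~0.03
print(2 ** (-beta(eps) * N))  # the bound, ~0.13
```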

Now we need to demonstrate that $\beta > 0$ for $0 < \epsilon < \frac{1}{2}$.

First we take the first and second derivatives of $\beta$ with respect to $\epsilon$.

Remember that $\beta = 1 + (\frac{1}{2}-\epsilon)\log_2{(\frac{1}{2}-\epsilon)} + (\frac{1}{2}+\epsilon)\log_2{(\frac{1}{2}+\epsilon)}$.

So

$$
\begin{align}
\beta' &= (\frac{1}{2}-\epsilon)\cdot\frac{-1}{(\frac{1}{2}-\epsilon)\ln{2}} + (-1)\cdot\log_2{(\frac{1}{2}-\epsilon)} + (\frac{1}{2}+\epsilon)\cdot\frac{1}{(\frac{1}{2}+\epsilon)\ln{2}} + 1\cdot\log_2{(\frac{1}{2}+\epsilon)} \\
&= -\frac{1}{\ln{2}} - \log_2{(\frac{1}{2}-\epsilon)} + \frac{1}{\ln{2}} + \log_2{(\frac{1}{2}+\epsilon)} \\
&= -\log_2{(\frac{1}{2}-\epsilon)} + \log_2{(\frac{1}{2}+\epsilon)} \\
\\
\beta'' &= -\frac{1}{(\frac{1}{2}-\epsilon)\ln{2}}\cdot(-1) + \frac{1}{(\frac{1}{2}+\epsilon)\ln{2}}\cdot 1 \\
&= \frac{1}{(\frac{1}{2}-\epsilon)\ln{2}} + \frac{1}{(\frac{1}{2}+\epsilon)\ln{2}}
\end{align}
$$

Because $0 < \epsilon < \frac{1}{2}$, we have $\frac{1}{2} < \frac{1}{2}+\epsilon < 1$ and $0 < \frac{1}{2}-\epsilon < \frac{1}{2}$, and $\ln{2} \approx 0.693 > 0$, so both terms of $\beta''$ are positive.

Therefore $\beta'' > 0$ for $0 < \epsilon < \frac{1}{2}$, which means $\beta$ is strictly convex and has a single minimum.
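
A finite-difference check of both derivative formulas (a numerical sketch; `h` is just a small step size and the sample values of $\epsilon$ are arbitrary), which also confirms $\beta'' > 0$ on the interval:

```python
import math

def beta(eps):
    return (1 + (0.5 - eps) * math.log2(0.5 - eps)
              + (0.5 + eps) * math.log2(0.5 + eps))

def beta_prime(eps):
    return -math.log2(0.5 - eps) + math.log2(0.5 + eps)

def beta_second(eps):
    return 1 / ((0.5 - eps) * math.log(2)) + 1 / ((0.5 + eps) * math.log(2))

h = 1e-5
for eps in [0.05, 0.2, 0.4]:
    fd1 = (beta(eps + h) - beta(eps - h)) / (2 * h)             # central difference
    fd2 = (beta(eps + h) - 2 * beta(eps) + beta(eps - h)) / h**2
    assert math.isclose(fd1, beta_prime(eps), rel_tol=1e-4)
    assert math.isclose(fd2, beta_second(eps), rel_tol=1e-3)
    assert beta_second(eps) > 0                                 # convexity
```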

Now let's find the minimum:

$$
\begin{align}
\beta' = 0 &\Rightarrow -\log_2{(\frac{1}{2}-\epsilon)} + \log_2{(\frac{1}{2}+\epsilon)} = 0 \\
&\Rightarrow \log_2{(\frac{1}{2}+\epsilon)} = \log_2{(\frac{1}{2}-\epsilon)} \\
&\Rightarrow \frac{1}{2}+\epsilon = \frac{1}{2}-\epsilon \\
&\Rightarrow \epsilon = 0
\end{align}
$$

At $\epsilon = 0$:

$$
\begin{align}
\beta &= 1 + (\frac{1}{2})\log_2{(\frac{1}{2})} + (\frac{1}{2})\log_2{(\frac{1}{2})} \\
&= 1 - \frac{1}{2} - \frac{1}{2} \\
&= 0
\end{align}
$$

Since $\beta$ is strictly convex with its unique minimum $\beta = 0$ at $\epsilon = 0$, we conclude that $\beta > 0$ for $0 < \epsilon < \frac{1}{2}$.
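
A brute-force confirmation (a sketch; the grid and step size are arbitrary) that $\beta$ is positive and strictly increasing on $(0, \frac{1}{2})$:

```python
import math

def beta(eps):
    return (1 + (0.5 - eps) * math.log2(0.5 - eps)
              + (0.5 + eps) * math.log2(0.5 + eps))

prev = 0.0                  # beta(0) = 0, the minimum found above
for k in range(1, 50):
    eps = 0.01 * k          # sweep eps over (0, 0.5)
    b = beta(eps)
    assert b > 0 and b > prev   # positive and strictly increasing
    prev = b
```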

Hence the bound $\mathbb{P}[u \geq \mathbb{E}(u) + \epsilon] \leq 2^{-\beta N}$ is exponentially decreasing in $N$.
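
To get a feel for the rate of decrease (a worked example with an arbitrary choice of $\epsilon = 0.1$, where $\beta \approx 0.029$):

```python
import math

def beta(eps):
    return (1 + (0.5 - eps) * math.log2(0.5 - eps)
              + (0.5 + eps) * math.log2(0.5 + eps))

eps = 0.1                   # beta(0.1) ~ 0.029
for N in (100, 1000, 10000):
    print(N, 2 ** (-beta(eps) * N))
# 100   -> ~1.3e-01
# 1000  -> ~1.8e-09
# 10000 -> ~3.6e-88
```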

NOTE: Using Jensen's inequality to prove this part is supposed to be simpler, but I just don't get Jensen's inequality.